Least-squares Probabilistic Classifier: a Computationally Efficient Alternative to Kernel Logistic Regression

Authors

  • Masashi Sugiyama
  • Hirotaka Hachiya
  • Makoto Yamada
  • Jaak Simm
  • Hyunha Nam
Abstract

The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression (KLR). A key idea for the speedup is that, unlike KLR that uses maximum likelihood estimation for a log-linear model, LSPC uses least-squares estimation for a linear model. This allows us to obtain a global solution analytically in a classwise manner. In exchange for the speedup, however, this linear least-squares formulation does not necessarily produce a non-negative estimate. Nevertheless, consistency of LSPC is guaranteed in the large sample limit, and rounding up a negative estimate to zero in finite sample cases was demonstrated not to degrade the classification performance in experiments. Thus, LSPC is a practically useful probabilistic classifier. In this paper, we give an overview of LSPC and its extensions to covariate shift, multi-task, and multi-label scenarios. A MATLAB implementation of LSPC is available from ‘http://sugiyama-www.cs.titech.ac.jp/~sugi/software/LSPC/’.
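The classwise analytic solution described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' MATLAB implementation: it assumes a Gaussian kernel model for each class posterior, fits each class by ridge-regularized least squares against the class-indicator vector (a closed-form solve, hence the speedup over KLR's iterative training), and applies the post-processing the abstract mentions, rounding negative estimates up to zero and renormalizing. The function names and hyperparameter values are illustrative, not from the paper.

```python
import numpy as np

def gaussian_kernel(X, C, sigma):
    # Gaussian kernel matrix between samples X and centers C.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def lspc_fit(X, y, n_classes, sigma=1.0, lam=0.1):
    # Classwise ridge regression of the class-indicator onto kernel features:
    # theta_c = (K^T K + lam*I)^{-1} K^T 1{y == c}, solved analytically
    # for all classes at once (one column of Theta per class).
    K = gaussian_kernel(X, X, sigma)
    A = K.T @ K + lam * np.eye(K.shape[1])
    Theta = np.linalg.solve(A, K.T @ np.eye(n_classes)[y])
    return Theta

def lspc_predict_proba(Theta, X_train, X_test, sigma=1.0):
    K = gaussian_kernel(X_test, X_train, sigma)
    Q = K @ Theta
    Q = np.maximum(Q, 0)                 # round negative estimates up to zero
    Q /= Q.sum(axis=1, keepdims=True)    # renormalize to class probabilities
    return Q
```

Note that the single linear solve replaces the (quasi-)Newton iterations KLR needs, which is where the computational saving comes from.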


Similar Resources

Improving the Accuracy of Least-Squares Probabilistic Classifiers

The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression. However, to ensure that its learned probabilities are non-negative, LSPC involves a post-processing step of rounding up negative parameters to zero, which can unexpectedly influence classification performance. In order to mitigate this problem, we propose a simple alternative...


Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting

Kernel logistic regression (KLR) is a powerful and flexible classification algorithm, which possesses an ability to provide the confidence of class prediction. However, its training—typically carried out by (quasi-)Newton methods—is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called Least-Squares Probabilistic Classifier (LSPC). KLR mode...


Computationally Efficient Multi-task Learning with Least-squares Probabilistic Classifiers

Probabilistic classification and multi-task learning are two important branches of machine learning research. Probabilistic classification is useful when the ‘confidence’ of a decision is necessary. On the other hand, the idea of multi-task learning is beneficial if multiple related learning tasks exist. So far, kernelized logistic regression has been a vital probabilistic classifier for the use ...


Probabilistic Discriminative Kernel Classifiers for Multi-class Problems

Logistic regression is presumably the most popular representative of probabilistic discriminative classifiers. In this paper, a kernel variant of logistic regression is introduced as an iteratively re-weighted least-squares algorithm in kernel-induced feature spaces. This formulation allows us to apply highly efficient approximation methods that are capable of dealing with large-scale problems....
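The iteratively re-weighted least-squares (IRLS) formulation mentioned in this abstract is the standard Newton scheme for kernel logistic regression, and it can be sketched generically as follows. This is a plain binary-KLR Newton solver under assumed conventions (labels in {0, 1}, regularizer (lam/2)·aᵀKa, decision function f = Ka), not the approximation method the paper itself develops; all names and defaults here are illustrative.

```python
import numpy as np

def sigmoid(z):
    # Numerically safe logistic function.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

def klr_irls(K, t, lam=0.1, n_iter=15):
    # IRLS / Newton for binary kernel logistic regression:
    # minimize  sum_i log(1 + exp(-f_i)) + t-dependent terms + (lam/2) a^T K a,
    # with f = K a. Each step solves a weighted least-squares system.
    n = K.shape[0]
    a = np.zeros(n)
    for _ in range(n_iter):
        p = sigmoid(K @ a)                       # current class probabilities
        W = p * (1 - p)                          # IRLS weights
        g = K @ (p - t) + lam * (K @ a)          # gradient
        H = K @ (W[:, None] * K) + lam * K + 1e-8 * np.eye(n)  # jittered Hessian
        a -= np.linalg.solve(H, g)               # Newton step
    return a
```

Each iteration costs a cubic-in-n linear solve, which is exactly the expense that motivates both the approximation methods of this paper and the analytic least-squares shortcut taken by LSPC.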


Fixed-size kernel logistic regression for phoneme classification

Kernel logistic regression (KLR) is a popular non-linear classification technique. Unlike an empirical risk minimization approach such as employed by Support Vector Machines (SVMs), KLR yields probabilistic outcomes based on a maximum likelihood argument, which are particularly important in speech recognition. Different from other KLR implementations, we use a Nyström approximation to solve large...
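The Nyström approximation this abstract refers to replaces the full n×n kernel matrix with a low-rank surrogate built from m landmark points, K ≈ K_nm K_mm⁻¹ K_mn. A generic sketch (not the paper's fixed-size implementation) computes an explicit m-dimensional feature map whose inner products reproduce that approximation; the Gaussian kernel and all parameter names below are assumptions for illustration.

```python
import numpy as np

def nystrom_features(X, landmarks, sigma=1.0):
    # Rank-m Nystrom feature map: phi(x) = K_mm^{-1/2} k_m(x), so that
    # phi(x)^T phi(x') approximates the full Gaussian kernel k(x, x').
    def gk(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2 * sigma ** 2))
    m = len(landmarks)
    Kmm = gk(landmarks, landmarks) + 1e-8 * np.eye(m)  # jitter for stability
    w, V = np.linalg.eigh(Kmm)
    w = np.maximum(w, 1e-12)                           # guard tiny eigenvalues
    Kmm_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return gk(X, landmarks) @ Kmm_inv_sqrt             # (n, m) feature matrix
```

With m ≪ n landmarks, a linear model (e.g. logistic regression) trained on these features stands in for full KLR at a fraction of the cost, which is what makes the approach viable for large speech corpora.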




Publication date: 2012